# Sequence-to-sequence pre-training
## BARTpho Word

- **License:** MIT
- **Owner:** vinai
- **Downloads:** 3,584 · **Likes:** 6
- **Tags:** Large Language Model, Transformers

BARTpho is the first pre-trained sequence-to-sequence model for Vietnamese. It is released in syllable and word versions and is suited to generative natural language processing tasks.
## mBARThez

- **License:** Apache-2.0
- **Owner:** moussaKam
- **Downloads:** 1,032 · **Likes:** 6
- **Tags:** Large Language Model, Transformers, French

BARThez is a French sequence-to-sequence pre-trained model based on the BART architecture, particularly suited to generative tasks such as abstractive summarization.
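As an illustration of how a card like this is typically used, here is a minimal sketch that loads the checkpoint with the Hugging Face transformers library. The model ID `moussaKam/mbarthez` is an assumption pieced together from the owner and model name on the card, and the mask-filling prompt is only a toy example of BART-style denoising:

```python
# Hedged sketch: assumes the card's checkpoint is published on the
# Hugging Face Hub as "moussaKam/mbarthez"; any BART-style seq2seq
# checkpoint loads the same way.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("moussaKam/mbarthez")
model = AutoModelForSeq2SeqLM.from_pretrained("moussaKam/mbarthez")

# BART is pre-trained as a denoiser: ask it to reconstruct a masked
# French sentence ("Paris is the capital of <mask>.").
text = f"Paris est la capitale de la {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

output_ids = model.generate(**inputs, max_length=32, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

After fine-tuning on a French summarization corpus, the same `generate` call would produce abstractive summaries instead of mask completions.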
## BARTpho Syllable

- **License:** MIT
- **Owner:** vinai
- **Downloads:** 3,826 · **Likes:** 6
- **Tags:** Large Language Model, Transformers

BARTpho is the first large-scale monolingual sequence-to-sequence pre-trained model for Vietnamese. It is based on the BART architecture and is particularly suited to generative natural language processing tasks.
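Both BARTpho cards describe the same Transformers-compatible usage, so one hedged sketch covers them. The checkpoint IDs `vinai/bartpho-syllable` and `vinai/bartpho-word` are assumptions based on the owner and model names shown above:

```python
# Hedged sketch: assumes the Hugging Face checkpoint IDs
# "vinai/bartpho-syllable" and "vinai/bartpho-word".
import torch
from transformers import AutoModel, AutoTokenizer

# Syllable version: raw Vietnamese text can be tokenized directly.
tokenizer = AutoTokenizer.from_pretrained("vinai/bartpho-syllable")
model = AutoModel.from_pretrained("vinai/bartpho-syllable")

line = "Chúng tôi là những nghiên cứu viên."  # "We are researchers."
inputs = tokenizer(line, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch, sequence length, hidden size)
```

The word version reportedly expects pre-segmented input, with multi-syllable Vietnamese words joined into single tokens by an external word segmenter before tokenization; apart from that, loading is identical to the syllable version.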